Showing posts with label Emotional and Mental Well Being. Show all posts

4/15/2013

How stress can boost the immune system


The study's findings provide a thorough overview of how a triad of stress hormones affects the main cell subpopulations of the immune system. They also offer the prospect of, someday, being able to manipulate stress-hormone levels to improve patients' recovery from surgery or wounds or their responses to vaccines.
You've heard it a thousand times: Stress is bad for you. And it's certainly true that chronic stress, lasting weeks and months, has deleterious effects including, notably, suppression of the immune response. But short-term stress -- the fight-or-flight response, a mobilization of bodily resources lasting minutes or hours in response to immediate threats -- stimulates immune activity, said lead author Firdaus Dhabhar, PhD, an associate professor of psychiatry and behavioral sciences and member of the Stanford Institute for Immunity, Transplantation and Infection.
And that's a good thing. The immune system is crucial for wound healing and preventing or fighting infection, and both wounds and infections are common risks during chases, escapes and combat.
Working with colleagues at Stanford and two other universities in a study published online June 22 in Psychoneuroendocrinology, Dhabhar showed that subjecting laboratory rats to mild stress caused a massive mobilization of several key types of immune cells into the bloodstream and then onto destinations including the skin and other tissues. This large-scale migration of immune cells, which took place over a time course of two hours, was comparable to the mustering of troops in a crisis, Dhabhar said. He and colleagues had previously shown that a similar immune-cell redistribution in patients experiencing the short-term stress of surgery predicts enhanced postoperative recovery.
In the new study, the investigators were able to show that the massive redistribution of immune cells throughout the body was orchestrated by three hormones released by the adrenal glands, in different amounts and at different times, in response to the stress-inducing event. These hormones are the brain's call-to-arms to the rest of the body, Dhabhar said.
"Mother Nature gave us the fight-or-flight stress response to help us, not to kill us," said Dhabhar, who has been conducting experiments for well over a decade on the effects of the major stress hormones on the immune system. Last summer, Dhabhar received the International Society for Psychoneuroendocrinology's Curt. P. Richter Award for his work in this area, culminating in the new study.
The findings paint a clearer picture of exactly how the mind influences immune activity. "An impala's immune system has no way of knowing that a lion is lurking in the grass and is about to pounce, but its brain does," Dhabhar said. In such situations, it benefits lion and impala alike when pathogen-fighting immune cells are in positions of readiness in such places as the skin and mucous membranes, which are at high risk for damage and consequent infection.
So it makes perfect evolutionary sense that predator/prey activity and other situations in nature, such as dominance challenges and sexual approaches, trigger stress hormones. "You don't want to keep your immune system on high alert at all times," Dhabhar said. "So nature uses the brain, the organ most capable of detecting an approaching challenge, to signal that detection to the rest of the body by directing the release of stress hormones. Without them, a lion couldn't kill, and an impala couldn't escape." The stress hormones not only energize the animals' bodies -- they can run faster, jump higher, bite harder -- but, it turns out, also mobilize the immune troops to prepare for looming trouble.
The response occurs across the animal kingdom, he added. You see pretty much the same pattern of hormone release in a fish that has been picked up out of the water.
The experiments in this study were performed on rats, which Dhabhar subjected to mild stress by confining them (gently, and with full ventilation) in transparent Plexiglas enclosures. He drew blood several times over a two-hour period and, for each time point, measured levels of three major hormones -- norepinephrine, epinephrine and corticosterone (the rat analog of cortisol in humans) -- as well as of several distinct immune-cell types in the blood.
What he saw was a pattern of carefully choreographed changes in blood levels of the three hormones along with the movement of many different subsets of immune cells from reservoirs such as the spleen and bone marrow into the blood and, finally, to various "front line" organs.
To show that specific hormones were responsible for movements of specific cell types, Dhabhar administered the three hormones, separately or in various combinations, to rats whose adrenal glands had been removed so they couldn't generate their own stress hormones. When the researchers mimicked the pattern of stress-hormone release previously observed in the confined rats, the same immune-cell migration patterns emerged in the rats without adrenal glands. Placebo treatment produced no such effect.
The general pattern, Dhabhar said, was that norepinephrine is released early and is primarily involved in mobilizing all major immune-cell types -- monocytes, neutrophils and lymphocytes -- into the blood. Epinephrine, also released early, mobilized monocytes and neutrophils into the blood, while nudging lymphocytes out into "battlefield" destinations such as skin. And corticosterone, released somewhat later, caused virtually all immune cell types to head out of circulation to the "battlefields."
The overall effect of these movements is to bolster immune readiness. A study published by Dhabhar and his colleagues in 2009 in the Journal of Bone and Joint Surgery assessed patients' recovery from surgery as a function of their immune-cell redistribution patterns during the stress of the operation. Those patients in whom the stress of surgery mobilized immune-cell redistributions similar to those seen in the confined rats in the new study did significantly better afterward than patients whose stress hormones less adequately guided immune cells to appropriate destinations.
The mechanisms Dhabhar has delineated could lead to medical applications, such as administering low doses of stress hormones or drugs that mimic or antagonize them in order to optimize patients' immune readiness for procedures such as surgery or vaccination. "More study will be required including in human subjects, which we hope to conduct, before these applications can be attempted," Dhabhar said. Closer at hand is the monitoring of patients' stress-hormone levels and immune-cell distribution patterns during surgery to assess their surgical prognosis, or during immunization to predict vaccine effectiveness.
The study was funded by the John D. & Catherine T. MacArthur Foundation, the Dana Foundation, the DeWitt Wallace Foundation, the Carl & Elizabeth Naumann Fund and the National Institutes of Health. The medical school's Department of Psychiatry and Behavioral Sciences also supported this work. Dhabhar's co-authors were statistician Eric Neri at Stanford, and neuroendocrinologists at Ohio State University and Rockefeller University.
------------------
The above story is reprinted from materials provided by Stanford University Medical Center. The original article was written by Bruce Goldman.
Stanford University Medical Center (2012, June 21). How stress can boost immune system. ScienceDaily. Retrieved April 15, 2013, from http://www.sciencedaily.com/releases/2012/06/120621223525.htm

4/13/2013

Daily Stress Takes a Toll on Long-Term Mental Health

"Our emotional responses to the stresses of daily life may predict our long-term mental health, according to a new study published in Psychological Science, a journal of the Association for Psychological Science.
Psychological scientist Susan Charles of the University of California, Irvine and colleagues conducted the study in order to answer a long-standing question: Do daily emotional experiences add up to make the straw that breaks the camel’s back, or do these experiences make us stronger and provide an inoculation against later distress?
Using data from two national surveys, the researchers examined the relationship between daily negative emotions and mental health outcomes ten years later.
Participants’ overall levels of negative emotions predicted psychological distress (e.g., feeling worthless, hopeless, nervous, and/or restless) and diagnosis of an emotional disorder like anxiety or depression a full decade after the emotions were initially measured.
Participants’ negative emotional responses to daily stressors — such as an argument or a problem at work or home — predicted psychological distress and self-reported emotional disorder ten years later.
The researchers argue that a key strength of the study was their ability to tap a large, national community sample of participants who spanned a wide age range. The results were based on data from 711 participants, both men and women, who ranged in age from 25 to 74. They were all participants in two national, longitudinal survey studies: Midlife Development in the United States (MIDUS) and National Study of Daily Experiences (NSDE).
According to Charles and her colleagues, these findings show that mental health outcomes aren’t affected only by major life events — they also bear the imprint of seemingly minor emotional experiences. The study suggests that the chronic nature of these negative emotions in response to daily stressors can take a toll on long-term mental health.
In addition to Charles, co-authors on the study include Jennifer Piazza of California State University, Fullerton; and Jacqueline Mogle, Martin Sliwinski, and David Almeida of Pennsylvania State University."
###
For more information about this study, please contact: Susan T. Charles at scharles@uci.edu.
The APS journal Psychological Science is the highest ranked empirical journal in psychology. For a copy of the article "The Wear and Tear of Daily Stressors on Mental Health" and access to other Psychological Science research findings, please contact Anna Mikulak at 202-293-9300 or amikulak@psychologicalscience.org.

5/21/2012

Stressed Men Are More Social


Freiburg researchers have refuted the common belief that stress always causes aggressive behavior. A team led by the psychologists and neuroscientists Prof. Markus Heinrichs and Dr. Bernadette von Dawans at the University of Freiburg, Germany, examined how men react in stressful situations -- and their results overturn a nearly 100-year-old doctrine.

According to this doctrine, humans and most animal species show the "fight-or-flight" response to stress. Only since the late 1990s have some scientists begun to argue that women show an alternate "tend-and-befriend" response to stress -- in other words, a protective ("tend") and friendship-offering ("befriend") reaction. Men, in contrast, were still assumed to become aggressive under stress. Von Dawans refuted this assumption, saying: "Apparently men also show social approach behavior as a direct consequence of stress."

With this study, the research team experimentally investigated male social behavior under stress for the first time. The results are published in the journal Psychological Science. The economists Prof. Ernst Fehr of the University of Zurich, Switzerland, and Prof. Urs Fischbacher of the University of Konstanz, Germany, as well as the psychologist Prof. Clemens Kirschbaum of the Technical University of Dresden, Germany, also participated in the study. Last year, Heinrichs and von Dawans developed a standardized procedure for inducing stress in groups using a public speaking task. The researchers examined the implications of this stressor for social behavior using specially designed social interaction games. These games allowed them to measure positive social behavior -- for example, trust or sharing -- and negative social behavior -- for example, punishment.

In the study, subjects who were under stress showed significantly more positive social behavior than control subjects who were not in a stressful situation. Negative social behavior, on the other hand, was not affected by stress. For Markus Heinrichs, this has far-reaching consequences for our understanding of the social significance of stress: "From previous studies in our laboratory, we already knew that positive social contact with a trusted individual before a stressful situation reduces the stress response. Apparently, this coping strategy is anchored so strongly that people can also change their stress responses during or immediately after the stress through positive social behavior."

Source: Albert-Ludwigs-Universität Freiburg [May 21, 2012]

5/18/2012

Emotionally Intelligent People Are Less Good at Spotting Liars


People who rate themselves as having high emotional intelligence (EI) tend to overestimate their ability to detect deception in others. This is the finding of a paper published in the journal Legal and Criminological Psychology on 18 May 2012.

Professor Stephen Porter, director of the Centre for the Advancement of Psychological Science and Law at the University of British Columbia, Canada, along with colleagues Dr. Leanne ten Brinke and Alysha Baker, used a standard questionnaire to measure the EI of 116 participants.

These participants were then asked to view 20 videos from around the world of people pleading for the safe return of a missing family member. In half the videos the person making the plea was responsible for the missing person's disappearance or murder.

The participants were asked to judge whether the pleas were honest or deceptive, say how much confidence they had in their judgements, report the cues they had used to make those judgements and rate their emotional response to each plea.

Professor Porter found that higher EI was associated with overconfidence in assessing the sincerity of the pleas and sympathetic feelings towards people in the videos who turned out to be responsible for the disappearance.

Although EI, in general, was not associated with being better or worse at discriminating between truths and lies, people with a higher ability to perceive and express emotion (a component of EI) were not so good at spotting when people were telling lies.

Professor Porter says: "Taken together, these findings suggest that features of emotional intelligence, and the decision-making processes they lead to, may have the paradoxical effect of impairing people's ability to detect deceit.

"This finding is important because EI is a well-accepted concept and is used in a variety of domains, including the workplace."

Source: British Psychological Society (BPS) [May 18, 2012]

5/16/2012

Are character traits determined genetically?


Genes play a greater role in forming character traits -- such as self-control, decision making or sociability -- than was previously thought, new research suggests.

A study of more than 800 sets of twins found that genetics were more influential in shaping key traits than a person's home environment and surroundings.

Psychologists at the University of Edinburgh, who carried out the study, say that genetically influenced characteristics could well be the key to how successful a person is in life.

The study of twins in the US -- most aged 50 and over -- used a series of questions to test how they perceived themselves and others. Questions included "Are you influenced by people with strong opinions?" and "Are you disappointed about your achievements in life?"

The results were then measured according to the Ryff Psychological Well-Being Scale which assesses and standardizes these characteristics.

By tracking their answers, the research team found that identical twins -- whose DNA is essentially the same -- were twice as likely to share traits as non-identical twins.

Psychologists say the findings are significant because the stronger the genetic link, the more likely it is that these character traits are carried through a family.
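The inference from twin comparisons to genetic influence can be made concrete with the classic heritability estimate used in twin studies, known as Falconer's formula. The article does not name this formula, and the correlations below are purely illustrative, but it is the standard calculation behind statements like "twice as likely to share traits":

```latex
% Falconer's formula: heritability estimated from twin correlations,
% where r_{MZ} is the trait correlation among identical (monozygotic)
% twins and r_{DZ} the correlation among non-identical (dizygotic) twins.
h^2 = 2\,(r_{MZ} - r_{DZ})

% Illustrative (hypothetical) numbers: if identical twins correlate at
% 0.6 and non-identical twins at 0.3 -- identical twins twice as similar --
h^2 = 2\,(0.6 - 0.3) = 0.6
% i.e., roughly 60 percent of the trait variance would be attributed
% to genetic differences.
```

The larger the gap between the two correlations, the stronger the inferred genetic contribution, which is the logic behind the researchers' claim.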

Professor Timothy Bates, of the University of Edinburgh's School of Philosophy, Psychology and Language Sciences, said that the genetic influence was strongest on a person's sense of self-control.

Researchers found that genes affected a person's sense of purpose, how well they get on with people and their ability to continue learning and developing.

Professor Bates added: "Ever since the ancient Greeks, people have debated the nature of a good life and the nature of a virtuous life. Why do some people seem to manage their lives, have good relationships and cooperate to achieve their goals while others do not? Previously, the role of family and the environment around the home often dominated people's ideas about what affected psychological well-being. However, this work highlights a much more powerful influence from genetics."

The study, which builds on previous research that found that happiness is underpinned by genes, is published online in the Journal of Personality.

Source: University of Edinburgh [May 16, 2012]

5/15/2012

A walk in the park gives mental boost to people with depression


A walk in the park may have psychological benefits for people suffering from depression. In one of the first studies to examine the effect of nature walks on cognition and mood in people with major depression, researchers in Canada and the U.S. have found promising evidence that a walk in the park may provide some cognitive benefits.


The study was led by Marc Berman, a post-doctoral fellow at Baycrest's Rotman Research Institute in Toronto, with partners from the University of Michigan and Stanford University. It is published online this week, ahead of print publication, in the Journal of Affective Disorders.

"Our study showed that participants with clinical depression demonstrated improved memory performance after a walk in nature, compared to a walk in a busy urban environment," said Dr. Berman, who cautioned that such walks are not a replacement for existing and well-validated treatments for clinical depression, such as psychotherapy and drug treatment.

"Walking in nature may act to supplement or enhance existing treatments for clinical depression, but more research is needed to understand just how effective nature walks can be to help improve psychological functioning," he said.

Dr. Berman's research is part of a cognitive science field known as Attention Restoration Theory (ART) which proposes that people concentrate better after spending time in nature or looking at scenes of nature. The reason, according to ART, is that people interacting with peaceful nature settings aren't bombarded with external distractions that relentlessly tax their working memory and attention systems. In nature settings, the brain can relax and enter a state of contemplativeness that helps to restore or refresh those cognitive capacities.

In a research paper he published in 2008 in Psychological Science, Dr. Berman showed that adults who were not diagnosed with any illness received a mental boost after an hour-long walk in a woodland park -- improving their performance on memory and attention tests by 20 percent -- compared to an hour-long stroll in a noisy urban environment. The findings were reported by The Wall Street Journal, The Boston Globe, The New York Times, and in the Pulitzer Prize finalist book by Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains.

In this latest study, Dr. Berman and his research team explored whether a nature walk would provide similar cognitive benefits, and also improve mood, for people with clinical depression. Given that individuals with depression are characterized by high levels of rumination and negative thinking, the researchers were skeptical at the outset that a solitary walk in the park would provide any benefit at all; they suspected it might instead worsen memory and exacerbate depressed mood.

For the study, 20 individuals were recruited from the University of Michigan and surrounding Ann Arbor area; all had a diagnosis of clinical depression. The 12 females and eight males (average age 26) participated in a two-part experiment that involved walking in a quiet nature setting and in a noisy urban setting.

Prior to the walks, participants completed baseline testing to determine their cognitive and mood status. Before beginning a walk, the participants were asked to think about an unresolved, painful autobiographical experience. They were then randomly assigned to go for an hour-long walk in the Ann Arbor Arboretum (woodland park) or traffic-heavy portions of downtown Ann Arbor. They followed a prescribed route and wore a GPS watch to ensure compliance.

After the walk, they completed a series of mental tests to measure their attention and short-term/working memory and were re-assessed for mood. A week later the participants repeated the entire procedure, walking in the location that was not visited in the first session.

Participants exhibited a 16 percent increase in attention and working memory after the nature walk relative to the urban walk. Interestingly, interacting with nature did not alleviate depressive mood to any noticeable degree over urban walks, as negative mood decreased and positive mood increased after both walks to a significant and equal extent. Dr. Berman says this suggests that separate brain mechanisms may underlie the cognitive and mood changes of interacting with nature. 

Source: Baycrest Centre for Geriatric Care [May 14, 2012]

5/09/2012

Psychologists reveal how emotion can shut down high-level mental processes without our knowledge


Psychologists at Bangor University believe that they have glimpsed, for the first time, a process that takes place deep within our unconscious brain, where primal reactions interact with higher mental processes. Writing in the Journal of Neuroscience, they identify a reaction to negative language inputs which shuts down unconscious processing.


For the last quarter of a century, psychologists have been aware of, and fascinated by, the fact that our brain can process high-level information such as meaning outside consciousness. What the psychologists at Bangor University have discovered is the reverse -- that our brain can unconsciously 'decide' to withhold information by preventing access to certain forms of knowledge.

The psychologists extrapolate this from their most recent findings working with bilingual people. Building on their previous discovery that bilinguals subconsciously access their first language when reading in their second language, the psychologists at the School of Psychology and Centre for Research on Bilingualism have now made the surprising discovery that our brain shuts down that same unconscious access to the native language when faced with a negative word such as war, discomfort, inconvenience or unfortunate.

They believe that this provides the first insight into a hitherto unproven process in which our unconscious mind blocks information from our conscious mind or higher mental processes.

This finding breaks new ground in our understanding of the interaction between emotion and thought in the brain. Previous work on emotion and cognition has already shown that emotion affects basic brain functions such as attention, memory, vision and motor control, but never at such a high processing level as language and understanding.

Key to this is the understanding that people have a greater reaction to emotional words and phrases in their first language -- which is why people speak to their infants and children in their first language despite living in a country that speaks another language, and despite fluency in the second. It has been recognised for some time that anger, swearing or discussing intimate feelings has more power in a speaker's native language. In other words, emotional information lacks the same power in a second language as in a native language.

Dr Yan Jing Wu of the University's School of Psychology said: "We devised this experiment to unravel the unconscious interactions between the processing of emotional content and access to the native language system. We think we've identified, for the first time, the mechanism by which emotion controls fundamental thought processes outside consciousness.

"Perhaps this is a process that resembles the mental repression mechanism that people have theorised about but never previously located."

So why would the brain block access to the native language at an unconscious level?

Professor Guillaume Thierry explains: "We think this is a protective mechanism. We know that in trauma for example, people behave very differently. Surface conscious processes are modulated by a deeper emotional system in the brain. Perhaps this brain mechanism spontaneously minimises negative impact of disturbing emotional content on our thinking, to prevent causing anxiety or mental discomfort."

He continues: "We were extremely surprised by our finding. We were expecting to find modulation between the different words -- and perhaps a heightened reaction to the emotional word -- but what we found was the exact opposite of what we expected: a cancellation of the response to the negative words."

The psychologists made this discovery by asking English-speaking Chinese people whether word pairs were related in meaning. Some of the word pairs were related in their Chinese translations. Although not consciously acknowledging a relation, measurements of electrical activity in the brain revealed that the bilingual participants were unconsciously translating the words. However, uncannily, this activity was not observed when the English words had a negative meaning.

Source: Bangor University [May 08, 2012]

5/08/2012

'Losing yourself' in a fictional character can affect your real life


When you "lose yourself" inside the world of a fictional character while reading a story, you may actually end up changing your own behavior and thoughts to match that of the character, a new study suggests.


Researchers at Ohio State University examined what happened to people who, while reading a fictional story, found themselves feeling the emotions, thoughts, beliefs and internal responses of one of the characters as if they were their own - a phenomenon the researchers call "experience-taking."

They found that, in the right situations, experience-taking may lead to real changes, if only temporary, in the lives of readers.

In one experiment, for example, the researchers found that people who strongly identified with a fictional character who overcame obstacles to vote were significantly more likely to vote in a real election several days later.

"Experience-taking can be a powerful way to change our behavior and thoughts in meaningful and beneficial ways," said Lisa Libby, co-author of the study and assistant professor of psychology at Ohio State University.

There are many ways experience-taking can affect readers.

In another experiment, people who went through this experience-taking process while reading about a character who was revealed to be of a different race or sexual orientation showed more favorable attitudes toward the other group and were less likely to stereotype.

"Experience-taking changes us by allowing us to merge our own lives with those of the characters we read about, which can lead to good outcomes," said Geoff Kaufman, who led the study as a graduate student at Ohio State. He is now a postdoctoral researcher at the Tiltfactor Laboratory at Dartmouth College.

Their findings appear online in the Journal of Personality and Social Psychology and will be published in a future print edition.

Experience-taking doesn't happen all the time. It only occurs when people are able, in a sense, to forget about themselves and their own self-concept and self-identity while reading, Kaufman said. In one experiment, for example, the researchers found that most college students were unable to undergo experience-taking if they were reading in a cubicle with a mirror.

"The more you're reminded of your own personal identity, the less likely you'll be able to take on a character's identity," Kaufman said.

"You have to be able to take yourself out of the picture, and really lose yourself in the book in order to have this authentic experience of taking on a character's identity."

In the voting study, 82 undergraduates who were registered and eligible to vote were assigned to read one of four versions of a short story about a student enduring several obstacles on the morning of Election Day (such as car problems, rain, long lines) before ultimately entering the booth to cast a vote. This experiment took place several days before the 2008 November presidential election.

Some versions were written in first person ("I entered the voting booth") while some were written in third person ("Paul entered the voting booth"). In addition, some versions featured a student who attended the same university as the participants, while in other versions, the protagonist in the story attended a different university.

After reading the story, the participants completed a questionnaire that measured their level of experience-taking - how much they adopted the perspective of the character in the story. For example, they were asked to rate how much they agreed with statements like "I found myself feeling what the character in the story was feeling" and "I felt I could get inside the character's head."

The results showed that participants who read a story told in first-person, about a student at their own university, had the highest level of experience-taking. And a full 65 percent of these participants reported they voted on Election Day, when they were asked later.

In comparison, only 29 percent of the participants voted if they read the first-person story about a student from a different university.

"When you share a group membership with a character from a story told in first-person voice, you're much more likely to feel like you're experiencing his or her life events," Libby said. "And when you undergo this experience-taking, it can affect your behavior for days afterwards."

While people are more likely to lose themselves in a character who is similar to themselves, what happens if they don't learn that a character is not similar until later in a story?

In one experiment, 70 male, heterosexual college students read a story about a day in the life of another student. There were three versions - one in which the character was revealed to be gay early in the story, one in which the student was identified as gay late in the story, and one in which the character was heterosexual.

Results showed that the students who read the story where the character was identified as gay late in the narrative reported higher levels of experience-taking than did those who read the story where the character's homosexuality was announced early.

"If participants knew early on that the character was not like them - that he was gay - that prevented them from really experience-taking," Libby said.

"But if they learned late about the character's homosexuality, they were just as likely to lose themselves in the character as were the people who read about a heterosexual student."

Even more importantly, the version of the story participants read affected how they thought about gays.

Those who read the gay-late narrative reported significantly more favorable attitudes toward homosexuals after reading the story than did readers of both the gay-early narrative and the heterosexual narrative.

Those who read the gay-late narrative also relied less on stereotypes of homosexuals - they rated the gay character as less feminine and less emotional than did the readers of the gay-early story.

"If people identified with the character before they knew he was gay, if they went through experience-taking, they had more positive views - the readers accepted that this character was like them," Kaufman said.

Similar results were found in a story where white students read about a black student, who was identified as black early or late in the story.

Libby said experience-taking is different from perspective-taking, where people try to understand what another person is going through in a particular situation - but without losing sight of their own identity.

"Experience-taking is much more immersive - you've replaced yourself with the other," she said.

The key is that experience-taking is spontaneous - you don't have to direct people to do it; it happens naturally under the right circumstances.

"Experience-taking can be very powerful because people don't even realize it is happening to them. It is an unconscious process," Libby said.

Author: Jeff Grabmeier | Source: Ohio State University [May 07, 2012]

5/01/2012

Dopamine impacts your willingness to work


Everyone knows that people vary substantially in how hard they are willing to work, but the origin of these individual differences in the brain remains a mystery.

High levels of dopamine activity, shown in orange and yellow, were found in the striatum (center) and ventromedial prefrontal cortex (right) in the brains of "go getters" [Credit: Zald Lab, Vanderbilt University]
Now the veil has been pushed back by a new brain imaging study that has found an individual's willingness to work hard to earn money is strongly influenced by the chemistry in three specific areas of the brain. In addition to shedding new light on how the brain works, the research could have important implications for the treatment of attention-deficit disorder, depression, schizophrenia and other forms of mental illness characterized by decreased motivation.

The study was published May 2 in the Journal of Neuroscience and was performed by a team of Vanderbilt scientists including post-doctoral student Michael Treadway and Professor of Psychology David Zald.

Using a brain mapping technique called positron emission tomography (PET scan), the researchers found that "go-getters" who are willing to work hard for rewards had higher release of the neurotransmitter dopamine in areas of the brain known to play an important role in reward and motivation, the striatum and ventromedial prefrontal cortex. On the other hand, "slackers" who are less willing to work hard for a reward had high dopamine levels in another brain area that plays a role in emotion and risk perception, the anterior insula.

"Past studies in rats have shown that dopamine is crucial for reward motivation," said Treadway, "but this study provides new information about how dopamine determines individual differences in the behavior of human reward-seekers."

The role of dopamine in the anterior insula came as a complete surprise to the researchers. The finding was unexpected because it suggests that more dopamine in the insula is associated with a reduced desire to work, even when it means earning less money. The fact that dopamine can have opposing effects in different parts of the brain complicates the picture regarding the use of psychotropic medications that affect dopamine levels for the treatment of attention-deficit disorder, depression and schizophrenia because it calls into question the general assumption that these dopaminergic drugs have the same effect throughout the brain.

The study was conducted with 25 healthy volunteers (52 percent female) ranging in age from 18 to 29. To determine their willingness to work for a monetary reward, the participants were asked to perform a button-pushing task. First, they were asked to select either an easy or a hard button-pushing task. Easy tasks earned $1 while the reward for hard tasks ranged up to $4. Once they made their selection, they were told they had a high, medium or low probability of getting the reward. Individual tasks lasted for about 30 seconds and participants were asked to perform them repeatedly for about 20 minutes.
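The easy/hard choice in this task is essentially a trade-off between expected payoff and effort. Here is a minimal sketch of that trade-off; the probability levels and the `effort_cost` parameter (standing in for the individual willingness-to-work differences the study measured) are illustrative assumptions, not the study's actual parameters.

```python
# Hypothetical sketch of the choice logic in the button-pushing task.
# Rewards follow the article ($1 easy, up to $4 hard); the probability
# levels and effort cost are assumptions for illustration only.

def expected_value(reward, probability):
    """Expected payoff of a single trial."""
    return reward * probability

EASY_REWARD = 1.00
HARD_REWARD = 4.00
PROBABILITIES = {"low": 0.12, "medium": 0.50, "high": 0.88}  # assumed levels

def prefers_hard(prob_label, effort_cost):
    """Choose the hard task only if its expected value, minus a
    subjective effort cost, beats the easy option at the same
    reward probability."""
    p = PROBABILITIES[prob_label]
    return expected_value(HARD_REWARD, p) - effort_cost > expected_value(EASY_REWARD, p)
```

On this sketch, a "go-getter" corresponds to a low subjective effort cost (choosing hard even at low probability), while a "slacker" corresponds to a high one.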

"At this point, we don't have any data proving that this 20-minute snippet of behavior corresponds to an individual's long-term achievement," said Zald, "but if it does measure a trait variable such as an individual's willingness to expend effort to obtain long-term goals, it will be extremely valuable."

The research is part of a larger project designed to search for objective measures for depression and other psychological disorders where motivation is reduced. "Right now our diagnoses for these disorders are often fuzzy and based on subjective self-report of symptoms," said Zald. "Imagine how valuable it would be if we had an objective test that could tell whether a patient was suffering from a deficit or abnormality in an underlying neural system. With objective measures we could treat the underlying conditions instead of the symptoms."

Further research is needed to examine whether similar individual differences in dopamine levels help explain the altered motivation seen in forms of mental illness such as depression and addiction. Additional research is under way to examine how medications specifically impact these motivational systems. 

Source: Vanderbilt University [May 01, 2012]

The bright side of death: Awareness of mortality can result in positive behaviors


Contemplating death doesn't necessarily lead to morose despondency, fear, aggression or other negative behaviors, as previous research has suggested. Following a review of dozens of studies, University of Missouri researchers found that thoughts of mortality can lead to decreased militaristic attitudes, better health decisions, increased altruism and helpfulness, and reduced divorce rates.


"According to terror management theory, people deal with their awareness of mortality by upholding cultural beliefs and seeking to become part of something larger and more enduring than themselves, such as nations or religions," said Jamie Arndt, study co-author and professor of psychological sciences. "Depending on how that manifests itself, positive outcomes can be the result."

For example, in one study American test subjects were reminded of death or a control topic and then either imagined a local catastrophe or were reminded of the global threat of climate change. Their militaristic attitudes toward Iran were then evaluated. After being reminded of death, people who were reminded of climate change were more likely to express lower levels of militarism than those who imagined a local disaster.

"The differences seen in this study resulted from the size of the group with which the test subjects identified," said Ken Vail, lead author and psychology doctoral student. "In both cases, they responded to the awareness of mortality by seeking to protect the relevant groups. When the threat was localized, subjects aggressively defended their local group; but when the threat was globalized, subjects associated themselves with humanity as a whole and became more peaceful and cooperative."

After real catastrophes, such as the terrorist attacks of 9/11 and the Oklahoma City bombing, people's heightened fear and awareness of death had both positive and negative effects.

"Both the news media and researchers tended to focus on the negative reaction to these acts of terrorism, such as violence and discrimination against Muslims, but studies also found that people expressed higher degrees of gratitude, hope, kindness and leadership after 9/11," Vail said. "In another example, after the Oklahoma City bombing, divorce rates went down in surrounding counties. After a stimulus escalates one's awareness of death, the positive reaction is to try to reaffirm that the world has positive aspects as well."

In their personal lives, people also were influenced to make positive choices after their awareness of death was increased. Studies found that conscious thoughts of death can inspire intentions to exercise more. Other studies found that keeping mortality in mind can reduce smoking and increase sunscreen use.

Even subconscious awareness of death can influence behavior. In one experiment, passers-by who had recently overheard conversations mentioning the value of helping were more likely to help strangers if they were walking within sight of cemeteries.

"Once we started developing this study we were surprised how much research showed positive outcomes from awareness of mortality," said Arndt. "It seems that people may be just as capable of doing the opposite and 'looking on the bright side of death,' as the Monty Python song says."

Source: University of Missouri-Columbia [April 30, 2012]

Highly Religious People Are Less Motivated by Compassion Than Are Non-Believers


"Love thy neighbor" is preached from many a pulpit. But new research from the University of California, Berkeley, suggests that the highly religious are less motivated by compassion when helping a stranger than are atheists, agnostics and less religious people.


In three experiments, social scientists found that compassion consistently drove less religious people to be more generous. For highly religious people, however, compassion was largely unrelated to how generous they were, according to the findings which are published in the most recent online issue of the journal Social Psychological and Personality Science.

The results challenge a widespread assumption that acts of generosity and charity are largely driven by feelings of empathy and compassion, researchers said. In the study, the link between compassion and generosity was found to be stronger for those who identified as being non-religious or less religious.

"Overall, we find that for less religious people, the strength of their emotional connection to another person is critical to whether they will help that person or not," said UC Berkeley social psychologist Robb Willer, a co-author of the study. "The more religious, on the other hand, may ground their generosity less in emotion, and more in other factors such as doctrine, a communal identity, or reputational concerns."

Compassion is defined in the study as an emotion felt when people see the suffering of others which then motivates them to help, often at a personal risk or cost.

While the study examined the link between religion, compassion and generosity, it did not directly examine the reasons for why highly religious people are less compelled by compassion to help others. However, researchers hypothesize that deeply religious people may be more strongly guided by a sense of moral obligation than their more non-religious counterparts.

"We hypothesized that religion would change how compassion impacts generous behavior," said study lead author Laura Saslow, who conducted the research as a doctoral student at UC Berkeley.

Saslow, who is now a postdoctoral scholar at UC San Francisco, said she was inspired to examine this question after an altruistic, nonreligious friend lamented that he had only donated to earthquake recovery efforts in Haiti after watching an emotionally stirring video of a woman being saved from the rubble, not because of a logical understanding that help was needed.

"I was interested to find that this experience -- an atheist being strongly influenced by his emotions to show generosity to strangers -- was replicated in three large, systematic studies," Saslow said.

In the first experiment, researchers analyzed data from a 2004 national survey of more than 1,300 American adults. Those who agreed with such statements as "When I see someone being taken advantage of, I feel kind of protective towards them" were also more inclined to show generosity in random acts of kindness, such as loaning out belongings and offering a seat on a crowded bus or train, researchers found.

When they looked into how much compassion motivated participants to be charitable in such ways as giving money or food to a homeless person, non-believers and those who rated low in religiosity came out ahead: "These findings indicate that although compassion is associated with pro-sociality among both less religious and more religious individuals, this relationship is particularly robust for less religious individuals," the study found.

In the second experiment, 101 American adults watched one of two brief videos, a neutral video or a heartrending one, which showed portraits of children afflicted by poverty. Next, they were each given 10 "lab dollars" and directed to give any amount of that money to a stranger. The least religious participants appeared to be motivated by the emotionally charged video to give more of their money to a stranger.

"The compassion-inducing video had a big effect on their generosity," Willer said. "But it did not significantly change the generosity of more religious participants."

In the final experiment, more than 200 college students were asked to report how compassionate they felt at that moment. They then played "economic trust games" in which they were given money to share -- or not -- with a stranger. In one round, they were told that another person playing the game had given a portion of their money to them, and that they were free to reward them by giving back some of the money, which had since doubled in amount.

Those who scored low on the religiosity scale, and high on momentary compassion, were more inclined to share their winnings with strangers than other participants in the study.

"Overall, this research suggests that although less religious people tend to be less trusted in the U.S., when feeling compassionate, they may actually be more inclined to help their fellow citizens than more religious people," Willer said.

In addition to Saslow and Willer, other co-authors of the study are UC Berkeley psychologists Dacher Keltner, Matthew Feinberg and Paul Piff; Katharine Clark at the University of Colorado, Boulder; and Sarina Saturn at Oregon State University.

The study was funded by grants from UC Berkeley's Greater Good Science Center, UC Berkeley's Center for the Economics and Demography of Aging, and the Metanexus Institute.

Author: Yasmin Anwar | Source: University of California - Berkeley [April 30, 2012]

4/28/2012

Big Girls Don’t Cry


A study to be published in the June 2012 issue of the Journal of Adolescent Health, looking at the relationships between body satisfaction and healthy psychological functioning in overweight adolescents, has found that young women who are happy with the size and shape of their bodies report higher levels of self-esteem. They may also be protected against the negative behavioral and psychological factors sometimes associated with being overweight.


A group of 103 overweight adolescents were surveyed between 2004 and 2006, assessing body satisfaction, weight-control behavior, importance placed on thinness, self-esteem and symptoms of anxiety and depression, among other factors.

"We found that girls with high body satisfaction had a lower likelihood of unhealthy weight-control behaviors like fasting, skipping meals or vomiting," said Kerri Boutelle, PhD, associate professor of psychiatry and pediatrics at the University of California, San Diego School of Medicine. Boutelle added that the positive relationship shown in this study between a girl's happiness with her body and her behavioral and psychological well-being suggests that improving body satisfaction could be a key component of interventions for overweight youth.

"A focus on enhancing self-image while providing motivation and skills to engage in effective weight-control behaviors may help protect young girls from feelings of depression, anxiety or anger sometimes associated with being overweight," said Boutelle.

Additional contributors included first author Taya R. Cromley, PhD, of UCLA; Stephanie Knatz and Roxanne Rockwell, UC San Diego; and Dianne Neumark-Sztainer, PhD, MPH, RD and Mary Story, PhD, RD, University of Minnesota, Minneapolis.

This study was supported by a University of Minnesota Children's Vikings Grant.

Source: University of California, San Diego Health Sciences [April 27, 2012]

Analytic Thinking Can Decrease Religious Belief


A new University of British Columbia study finds that analytic thinking can decrease religious belief, even in devout believers. The study, which is published in the April 27 issue of Science, finds that thinking analytically increases disbelief among believers and skeptics alike, shedding important new light on the psychology of religious belief.


“Our goal was to explore the fundamental question of why people believe in a God to different degrees,” says lead author Will Gervais, a PhD student in UBC’s Dept. of Psychology. “A combination of complex factors influence matters of personal spirituality, and these new findings suggest that the cognitive system related to analytic thoughts is one factor that can influence disbelief.”

Researchers used problem-solving tasks and subtle experimental priming – including showing participants Rodin’s sculpture The Thinker or asking participants to complete questionnaires in hard-to-read fonts – to successfully produce “analytic” thinking. The researchers, who assessed participants’ belief levels using a variety of self-reported measures, found that religious belief decreased when participants engaged in analytic tasks, compared to participants who engaged in tasks that did not involve analytic thinking.

The findings, Gervais says, are based on a longstanding human psychology model of two distinct, but related cognitive systems to process information: an “intuitive” system that relies on mental shortcuts to yield fast and efficient responses, and a more “analytic” system that yields more deliberate, reasoned responses.

“Our study builds on previous research that links religious beliefs to ‘intuitive’ thinking,” says study co-author and Associate Prof. Ara Norenzayan, UBC Dept. of Psychology. “Our findings suggest that activating the ‘analytic’ cognitive system in the brain can undermine the ‘intuitive’ support for religious belief, at least temporarily.”

The study involved more than 650 participants in the U.S. and Canada. Gervais says future studies will explore whether the increase in religious disbelief is temporary or long-lasting, and how the findings apply to non-Western cultures.

Recent figures suggest that the majority of the world’s population believes in a God; however, atheists and agnostics number in the hundreds of millions, says Norenzayan, a co-director of UBC’s Centre for Human Evolution, Cognition and Culture. Religious convictions are shaped by psychological and cultural factors and fluctuate across time and situations, he says.

Source: University of British Columbia [April 26, 2012]

4/11/2012

Teamwork linked to intelligence


Learning to work in teams may explain why humans evolved a bigger brain, according to a new study published on Wednesday. 

Learning to work in teams may explain why humans evolved a bigger brain [Credit: AFP]
Compared to his hominid predecessors, Homo sapiens is a cerebral giant, a riddle that scientists have long tried to solve. 

The answer, according to researchers in Ireland and Scotland, may lie in social interaction. 

Working with others helped Man to survive, but he had to develop a brain big enough to cope with all the social complexities, they believe. 

In a computer model, the team simulated the human brain, allowing a network of neurons to evolve in response to a series of social challenges. 

There were two scenarios. The first entailed two partners in crime who had been caught by the police, each having to decide whether or not to inform on the other. 

The second had two individuals trapped in a car in a snowdrift and having to weigh whether to cooperate to dig themselves out or just sit back and let the other do it. 

In both cases, the individual would gain more from selfishness. 
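The two scenarios described above are the classic prisoner's dilemma and the snowdrift game. A sketch with standard textbook payoffs (the numbers are conventional assumptions for illustration, not values from the paper):

```python
# Illustrative payoff matrices for the two simulated scenarios.
# Entries: payoffs[(my_move, partner_move)] -> my payoff.
C, D = "cooperate", "defect"

prisoners_dilemma = {(C, C): 3, (C, D): 0, (D, C): 5, (D, D): 1}
snowdrift         = {(C, C): 3, (C, D): 1, (D, C): 5, (D, D): 0}

def best_response(payoffs, partner_move):
    """Return the move that maximizes my payoff given the partner's move."""
    return max((C, D), key=lambda my_move: payoffs[(my_move, partner_move)])
```

In both games, defecting pays best against a cooperating partner, which is the sense in which "the individual would gain more from selfishness." In the prisoner's dilemma defection pays best regardless of the partner's move, whereas in the snowdrift game the best response to a defector is to cooperate (dig out alone rather than stay stuck).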

But the researchers were intrigued to find that as the brain evolved, the individual was likelier to choose to cooperate. 

"We cooperate in large groups of unrelated individuals quite frequently, and that requires cognitive abilities to keep track of who is doing what to you and change your behaviour accordingly," co-author Luke McNally of Dublin's Trinity College told AFP. 

McNally pointed out, though, that cooperation has a calculating side. We do it out of reciprocity. 

"If you cooperate and I cheat, then next time we interact you could decide: 'Oh well, he cheated last time, so I won't cooperate with him.' So basically you have to cooperate in order to receive cooperation in the future." 

McNally said teamwork and bigger brainpower fed off each other. 

"Transitions to cooperative, complex societies can drive the evolution of a bigger brain," he said. 

"Once greater levels of intelligence started to evolve, you saw cooperation going much higher." 

The study appears in Proceedings of the Royal Society B, a journal published by Britain's de facto academy of sciences. 

Commenting on the paper, Robin Dunbar, an evolutionary anthropologist at Oxford University, said the findings were a valuable addition to the understanding of brain evolution. 

But he said there were physiological limits to cooperation. 

Man would need a "house-sized brain" to take cooperation to a perfect level on a planet filled with humans. 

"Our current brain size limits the community size that we can manage ... that we feel we belong to," he said. 

Our comfortable "personal social network" is limited to about 150, and boosting that to 500 would require a doubling of the size of the brain. 

"In order to create greater social integration, greater social cohesion even on the size of France, never mind the size of the EU, never mind the planet, we probably have to find other ways of doing it" than wait for evolution, said Dunbar. 

Source: AFP [April 11, 2012]

Do I look bigger with my finger on a trigger? Yes, says study


UCLA anthropologists asked hundreds of Americans to guess the size and muscularity of four men based solely on photographs of their hands holding a range of easily recognizable objects, including handguns. 

Photo from study. Holding a gun like this makes a man appear taller and stronger than he would otherwise, UCLA anthropologists have found [Credit: Daniel Fessler/UCLA]
The research, which was published April 11 in the scholarly journal PLoS ONE, confirms what scrawny thugs have long known: Brandishing a weapon makes a man appear bigger and stronger than he would otherwise. 

"There's nothing about the knowledge that gun powder makes lead bullets fly through the air at damage-causing speeds that should make you think that a gun-bearer is bigger or stronger, yet you do," said Daniel Fessler, the lead author of the study and an associate professor of anthropology at UCLA. "Danger really does loom large -- in our minds." 

Researchers say the findings suggest an unconscious mental mechanism that gauges a potential adversary and then translates the magnitude of that threat into the same dimensions used by animals to size up their adversaries: size and strength. 

"We've isolated a capacity to assess threats in a simple way," said Colin Holbrook, a UCLA postdoctoral scholar in anthropology and co-author of the study. "Though this capacity is very efficient, it can misguide us." 

The study is part of a larger project funded by the U.S. Air Force Office of Scientific Research to understand how people make decisions in situations where violent conflict is a possibility. The findings are expected to have ramifications for law enforcement, prison guards and the military. 

"We're exploring how people think about the relative likelihood that they will win a conflict, and then how those thoughts affect their decisions about whether to enter into conflict," said Fessler, whose research focuses on the biological and cultural bases of human behavior. He is the director of UCLA's Center for Behavior, Evolution and Culture, an interdisciplinary group of researchers who explore how various forms of evolution shape behavior. 

For the study, the UCLA researchers recruited participants in multiple rounds using classified advertisements on the websites Craigslist and Mechanical Turk. In one round, 628 individuals were asked to look at four pictures of different hands, each holding a single object: a caulking gun, electric drill, large saw or handgun. 

"Tools were used as control objects to rule out the possibility that a simple link with traditionally masculine objects would explain intuitions that the weapon-holders were larger and stronger," Fessler explained. 

The individuals were then asked to estimate the height of each hand model in feet and inches based solely on the photographs of their hands. Participants also were shown six images of progressively taller men and six images of progressively more muscular men and asked to estimate which image came closest to the probable size and strength of the hand model. 

Study participants consistently judged pistol-packers to be taller and stronger than the men holding the other objects, even though the experiment's four hand models were recruited on the basis of their equivalent hand size and similar hand appearance (white and without identifying marks such as tattoos or scars). 

To rule out the possibility that a feature of any one hand might influence the estimates, researchers had taken separate pictures of each hand holding each object -- some participants saw the gun held by one hand model, others saw the same gun held by another model, and so on; they did the same thing for each of the objects. The researchers also shuffled the order in which the photos were presented. 

On average, participants judged pistol packers to be 17 percent taller and stronger than those judged to be the smallest and weakest men -- the ones holding caulking guns. Hand models holding the saw and drill followed gun-wielders in size and strength. 

"The function of the system is to provide an easy way for people to assess the likelihood that they would win or lose in a conflict," said Jeffrey K. Snyder, a UCLA graduate student in anthropology and a study co-author. 

Concerned that their findings might be influenced by popular culture, which often depicts gun-slingers as big and strong men, the team conducted two more studies using objects that did not seem to have a macho image: a kitchen knife, a paint brush and a large, brightly colored toy squirt gun. In the initial round, a new group of 100 subjects was asked to evaluate the danger posed by each of the objects (which were presented alone, without hands holding them). They then were asked to pick the type of person most associated with the object: a child, a woman or a man. 

Not surprisingly, individuals rated the knife most dangerous, followed by the paint brush and squirt gun. But where the most lethal object in the earlier studies -- the handgun -- would likely have been associated with men, participants in this study most often associated the most lethal object -- the kitchen knife -- with women. The paint brush was most often associated with men, and the squirt gun with children. 

In the final round of tests, a new group of 541 individuals was shown male hands holding the knife, paint brush and squirt gun and was then asked to estimate the height and muscularity of the hand models. Once again, men holding the most lethal object -- in this case, the kitchen knife -- were judged to be the biggest and strongest, followed by those holding the paint brush and the squirt gun. 

"It's not Dirty Harry's or Rambo's handgun -- it's just a kitchen knife, but it's still deadly," Holbrook said. "And our study subjects responded accordingly, estimating its holder to be bigger and stronger than the rest." 

Author: Meg Sullivan | Source: University of California - Los Angeles [April 11, 2012]

4/04/2012

Does religious faith lead to greater rewards here on Earth?


Delayed gratification: People who are good at overcoming their immediate impulses to take small rewards now, in favor of larger rewards down the road, do better in many areas of life, including academic achievement, income, job performance and health. What life experiences develop this ability? A new study published online, ahead of print, by the journal Evolution and Human Behavior finds that religious people are better able to forgo immediate satisfaction in order to gain larger rewards in the future. The study is the first to demonstrate an association between religious commitment and a stronger preference for delayed, but more significant, rewards. 


"It's possible to analyze virtually all contemporary social concerns, from excessive credit card debt to obesity, as problems of impulsivity. So the fact that religious people tend to be less impulsive has implications for the sorts of decisions they make with their money, time, and other resources," says Michael McCullough, professor of Psychology in the College of Arts and Sciences at the University of Miami (UM), and principal investigator of this study. "Their tendency toward less impulsive decision-making might even be relevant to their stands on public policy issues, such as whether governments should be seeking to reduce their expenditures on public services and entitlement programs in the current economic environment." 

In the research work, titled "Religious people discount the future less," 277 undergraduate students from a variety of religious denominations and ethnic backgrounds chose between receiving a small monetary reward that the investigators made available immediately (for example, "$50 today") or a larger reward that was available only after a longer amount of time had passed (for example, "$100 six months from now"). Participants' commitment to their religious beliefs and institutions was also measured, among other relevant variables. The data show that the extent to which the participants follow religious teachings positively correlates with their ability to delay gratification. 

The findings suggest that through religious beliefs and practices, people "develop a more patient style of decision making." According to the study, religion teaches this type of patience by directing people's attention to the distant future—the afterlife—which may cause their nearer-term future on this earth to feel subjectively closer. 
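Choices like "$50 today" versus "$100 six months from now" are commonly summarized with a hyperbolic discounting model, V = A / (1 + kD), where a smaller discount rate k corresponds to a more patient chooser. This is a standard model from the delay-discounting literature, shown here only as an illustration; it is not necessarily the exact model the authors fit, and the k values below are invented.

```python
# Hyperbolic delay discounting: the subjective present value of a
# delayed reward shrinks with delay D at individual rate k.
# (Standard model for illustration; k values are made up.)

def discounted_value(amount, delay_days, k):
    """Subjective present value of a reward received delay_days from now."""
    return amount / (1.0 + k * delay_days)

def chooses_delayed(k, now=50.0, later=100.0, delay_days=180):
    """True if the delayed $100 feels worth more than $50 today."""
    return discounted_value(later, delay_days, k) > now

# A patient chooser (small k) takes the delayed $100;
# an impulsive chooser (large k) takes the $50 now.
```

On this reading, the study's claim is that religiosity is associated with a smaller effective k, so the delayed reward retains more of its subjective value.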

"People who are intrinsically religious and who indicate an interest in the afterlife tend to report that the future feels as though it is approaching quickly and that they spend a lot of time thinking about the future," the study says. 

Source: University of Miami [April 04, 2012]

Keep aging brains sharp


Exercising, eating a healthy diet and playing brain games may help you keep your wits about you well into your 80s and even 90s, advises a new book by researchers at George Mason University. 


"These are all cheap, easy things to do," says Pamela Greenwood, an associate professor in the Department of Psychology on Mason's Fairfax, Va. campus. "We should all be doing them anyway. You should do them for your heart and health, so why not do them for your brain as well?" 

For the past 20 years, Greenwood and Raja Parasuraman, University Professor of Psychology, have studied how the mind and brain age, focusing on Alzheimer's disease. Their book, "Nurturing the Older Brain and Mind" published by MIT Press, came out in March. The cognitive neuroscientists geared the book to middle-aged readers who want to keep their mental snap. 

"We know that if we can put off dementing illnesses even by a year or two through lifestyle changes, that will reduce the number of people with Alzheimer's disease, which is reaching epidemic proportions," Parasuraman says. 

Not everyone's brain declines when retirement age hits. "You can look at a group of 65-year-olds — some are in nursing homes, and some are running the world," Greenwood says. 

Now that more workers are staying on the job longer for economic reasons and because countries are upping the retirement age, keeping the mind agile becomes paramount, Parasuraman says. 

For the book, Parasuraman and Greenwood examined only scientific studies, theirs and others, ranging from neurological to physiological. A few surprises leaped out of the data. 

"Several old dogmas were overturned," Parasuraman says. "There's the tired old joke that we're losing brain cells as we age — maybe starting as young as 20 or 30 — and it's all downhill after that." 

Not so, new research reveals. Not only are some 60-year-olds as sharp as 20-year-olds, but their brains still create new cells. Brain cells may not grow as fast as bone or skin cells, but grow they do, particularly in the hippocampus. "It's the area of the brain that's very important to memory and is affected by Alzheimer's disease," Parasuraman says. 

Novel experiences and new learning help new brain cells become part of the circuitry. Parasuraman points to a study of terminally ill cancer patients whose brains were still forming new neurons. "If a person who's in a terminally ill state can generate new neurons, then surely healthy people can," Parasuraman says. 

Brain games and new experiences may build up "white matter," which insulates neurons as they carry signals, Greenwood says. In older brains, this white matter insulation develops holes and signals go awry. 

Older adult gamers are winning skills to help them move through life, Parasuraman says. "We are looking at everyday problem solving," he says. "Are you better at balancing a checkbook? Are you better at making decisions in a grocery store? We're finding you get better at those tasks (after playing the video games in the study)." 

Moving large muscle groups also builds brain matter. In one study detailed in the book, older, sedentary people began walking or did stretching exercises for 45 minutes, three times a week. "Those people actually became smarter over time," Greenwood says. "You don't have to be running Ironman marathons. You can just walk briskly three or four times a week." 

Another best bet for an active mind is a nutritious diet that limits calories to the minimum amount needed to keep a body healthy. No starvation diets, though. "The strongest evidence we have is not very pleasant, which is dietary restriction, reducing calories," Parasuraman says. "That clearly improves longevity and cognition. The evidence in animals is very strong. Such dietary restriction may never be popular. But perhaps every-other-day fasting as an approximation to it is something people would tolerate: You eat normally one day, and the next day you don't." 

Popping supplements won't fill a nutritionally deficient diet, Parasuraman says. "A lot of people think, 'I can eat junk food and then take a pill.' No. You have to eat fruits and vegetables, leafy vegetables. It has to be part of the regular diet because otherwise it's not absorbed." 

Fats help make up cell membranes. The unsaturated fats found in fish and olive oils may boost flexibility in these membranes, and the more flexible membranes are, the better they may work, scientists theorize. Saturated fats such as butter have to go because these fats vie with healthy fats for a place in the cell membrane, Greenwood explains. 

Greenwood and Parasuraman want people to know that getting old doesn't mean getting senile. "The bottom line message of the book is really a hopeful one," Greenwood says. "There are lots of things that you can do (to keep your brain healthy)." 

Source: George Mason University [April 04, 2012]
